Elad Gil writes: There have been multiple calls to regulate AI. It is too early to do so.

Calls for regulation are:

self-serving for the parties asking for it (it is not surprising that the main incumbents say regulation is good for AI, as it will lock in their incumbency). Some notable counterexamples exist where we should likely regulate things related to AI, but these are few and far between (e.g. export of advanced chip technology to foreign adversaries is a notable one).


Vox (Aug 2023), The AI rules that US policymakers are considering, explained, covering the Algorithmic Accountability Act (the FTC regulating claims made by GPTs), mandatory “safety audits”, licensing requirements, etc.

De Jure Regulation

Politico (March 2023) reports that Europe’s original plan to bring AI under control is no match for the technology’s new, shiny chatbot application:

working to impose stricter requirements on both developers and users of ChatGPT and similar AI models, including managing the risk of the technology and being transparent about its workings. They are also trying to slap tougher restrictions on large service providers while keeping a lighter-touch regime for everyday users playing around with the technology.

De Facto Regulation

Several big companies have reminded their employees not to enter confidential information into ChatGPT:

JPMorgan joins Amazon, Verizon and Accenture in banning staff from using the chatbot.

Also see AI Plagiarism